48 research outputs found

    Chaos and Learning in Discrete-Time Neural Networks

    We study a family of discrete-time recurrent neural network models in which the synaptic connectivity changes slowly with respect to the neuronal dynamics. The fast (neuronal) dynamics of these models display a wealth of behaviors, ranging from simple convergence and oscillation to chaos, and the addition of slow (synaptic) dynamics, which mimic the biological mechanisms of learning and memory, induces complex multiscale dynamics that render rigorous analysis quite difficult. Nevertheless, we prove a general result on the interplay of these two dynamical timescales, demarcating a regime of parameter space within which a gradual dampening of chaotic neuronal behavior is induced by a broad class of learning rules.
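    The abstract does not give the model equations, so the sketch below is only a minimal illustration of the kind of two-timescale system it describes: a fast discrete-time tanh recurrent map pushed toward chaos by a large coupling gain, paired with a slow weight update (here an anti-Hebbian rule with decay, one arbitrary member of the "broad class of learning rules") whose small learning rate epsilon sets the timescale separation. All parameter values and the particular learning rule are assumptions chosen for illustration, not details taken from the paper.

    # Minimal two-timescale sketch (illustrative assumptions, not the authors' model):
    # fast neuronal map x_{t+1} = W_t tanh(x_t), slow synaptic update with rate epsilon.
    import numpy as np

    rng = np.random.default_rng(0)

    N = 100          # number of neurons
    g = 2.5          # coupling gain; g >> 1 puts the fast map in a chaotic regime
    epsilon = 1e-3   # slow learning rate; epsilon << 1 separates the two timescales
    T = 20_000       # number of fast steps

    # Random connectivity with standard deviation g / sqrt(N).
    W = rng.normal(0.0, g / np.sqrt(N), size=(N, N))
    x = rng.uniform(-1.0, 1.0, size=N)

    for t in range(T):
        r = np.tanh(x)          # firing rates
        x = W @ r               # fast (neuronal) update
        # Slow (synaptic) update: anti-Hebbian term plus weight decay, which
        # gradually shrinks the effective gain of W.
        W += epsilon * (-np.outer(r, r) - 0.01 * W)

    # With the recurrent drive weakened, the activity typically settles toward a
    # fixed point or a low-amplitude oscillation rather than remaining chaotic.
    print("final rate norm:", np.linalg.norm(np.tanh(x)))

    Scanning g and epsilon in a sketch like this is one informal way to see the sort of parameter regime the paper's theorem demarcates, where slow learning gradually suppresses the fast chaotic activity.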

    Information-theoretic bounds and phase transitions in clustering, sparse PCA, and submatrix localization

    We study the problem of detecting a structured, low-rank signal matrix corrupted with additive Gaussian noise. This includes clustering in a Gaussian mixture model, sparse PCA, and submatrix localization. Each of these problems is conjectured to exhibit a sharp information-theoretic threshold, below which the signal is too weak for any algorithm to detect. We derive upper and lower bounds on these thresholds by applying the first and second moment methods to the likelihood ratio between these "planted models" and null models where the signal matrix is zero. Our bounds differ by at most a factor of root two when the rank is large (in the clustering and submatrix localization problems, when the number of clusters or blocks is large) or the signal matrix is very sparse. Moreover, our upper bounds show that for each of these problems there is a significant regime where reliable detection is information-theoretically possible but where known algorithms such as PCA fail completely, since the spectrum of the observed matrix is uninformative. This regime is analogous to the conjectured "hard but detectable" regime for community detection in sparse graphs.
    Comment: For sparse PCA and submatrix localization, we determine the information-theoretic threshold exactly in the limit where the number of blocks is large or the signal matrix is very sparse, based on a conditional second moment method, closing the factor-of-root-two gap in the first version.
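    As a concrete, assumed instance of the planted-versus-null comparison the abstract describes, the sketch below generates a rank-one spiked Wigner matrix, a standard toy stand-in for these structured low-rank models, and uses the top eigenvalue as a PCA-style test statistic. The Rademacher spike, the noise normalization, and the location of the spectral threshold are conventions of this toy model, not details taken from the paper; the "hard but detectable" regime mentioned in the abstract arises for sparse or high-rank signals, where the information-theoretic threshold sits strictly below the point at which the spectrum becomes informative.

    # Toy planted model (illustrative, not the paper's exact setup):
    # Y = (snr / n) * x x^T + W, with x a Rademacher vector and W a Wigner matrix
    # whose bulk spectrum fills [-2, 2]. PCA-style detection looks for a top
    # eigenvalue that escapes the bulk, which happens only for snr > 1 here.
    import numpy as np

    def top_eigenvalue(snr, n, rng):
        x = rng.choice([-1.0, 1.0], size=n)          # planted spike, ||x||^2 = n
        W = rng.normal(size=(n, n))
        W = (W + W.T) / np.sqrt(2 * n)               # Wigner noise, bulk edge near 2
        Y = (snr / n) * np.outer(x, x) + W           # planted model; snr = 0 gives the null
        return np.linalg.eigvalsh(Y)[-1]

    rng = np.random.default_rng(1)
    n = 1000
    for snr in (0.5, 1.5):
        # Below snr = 1 the top eigenvalue sticks to the bulk edge near 2 and the
        # spectrum is uninformative; above it, the eigenvalue separates to roughly
        # snr + 1/snr, and the spectral test detects the signal.
        print(f"snr={snr}: top eigenvalue = {top_eigenvalue(snr, n, rng):.3f}")

    The paper's first and second moment bounds concern when any test can distinguish the planted matrix from pure noise, which can be a strictly weaker requirement than this spectral test succeeding.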

    Comparison of Identified Teaching Success Characteristics of Adult Vocational Teachers with their Attitude Inventory Scores

    Agricultural Education